Kernelized Support Tensor Machines

Authors

  • Lifang He
  • Chun-Ta Lu
  • Guixiang Ma
  • Shen Wang
  • LinLin Shen
  • Philip S. Yu
  • Ann B. Ragin
Abstract

In the context of supervised tensor learning, preserving the structural information and exploiting the discriminative nonlinear relationships of tensor data are crucial for improving the performance of learning tasks. Based on tensor factorization theory and kernel methods, we propose a novel Kernelized Support Tensor Machine (KSTM) which integrates kernelized tensor factorization with the maximum-margin criterion. Specifically, the kernelized factorization technique is introduced to approximate the tensor data in kernel space so that the complex nonlinear relationships within tensor data can be explored. Further, dual structural preserving kernels are devised to learn the nonlinear boundary between tensor data. As a result of joint optimization, the kernels obtained in KSTM exhibit better generalization power for discriminative analysis. Experimental results on real-world neuroimaging datasets show the superiority of KSTM over state-of-the-art techniques.
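The abstract describes mapping tensor data into kernel space while keeping mode-wise structure intact. As a rough, self-contained illustration of that idea (not the paper's exact factorization or kernel construction), the sketch below builds a toy structure-preserving kernel from the leading rank-1 factor of each mode-n unfolding; the `unfold`, `leading_factor`, and `gamma` names are assumptions for this example, not terms from the paper:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: fibers along `mode` become the rows."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def leading_factor(T, mode):
    """Leading left singular vector of the mode-n unfolding,
    sign-fixed so its largest-magnitude entry is positive."""
    u = np.linalg.svd(unfold(T, mode), full_matrices=False)[0][:, 0]
    return u * np.sign(u[np.argmax(np.abs(u))])

def structural_kernel(X, Y, gamma=1.0):
    """Toy structure-preserving kernel: product of mode-wise RBF
    kernels between the rank-1 factors of each mode."""
    k = 1.0
    for mode in range(X.ndim):
        diff = leading_factor(X, mode) - leading_factor(Y, mode)
        k *= np.exp(-gamma * np.dot(diff, diff))
    return k

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 5, 6))
Y = rng.standard_normal((4, 5, 6))
print(structural_kernel(X, X))  # 1.0 for identical tensors
print(structural_kernel(X, Y) == structural_kernel(Y, X))  # symmetric
```

A Gram matrix built from such a kernel could then be handed to any max-margin solver that accepts precomputed kernels, which is the role the learned kernels play in the method described above.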


Similar articles

Data Reduction in Support Vector Machines by a Kernelized Ionic Interaction Model

A major drawback of support vector machines is their high computational complexity. In this paper, we introduce a novel kernelized ionic interaction (IoI) model for data reduction in support vector machines. We also present a data reduction method based on the kernelized instance based (KIB2) algorithm. We show that the computation time can be significantly reduced without any significant decre...


The Kernelized Stochastic Batch Perceptron

We present a novel approach for training kernel Support Vector Machines, establish learning runtime guarantees for our method that are better than those of any other known kernelized SVM optimization approach, and show that our method works well in practice compared to existing alternatives.


Relationships Between Support Vector Classifiers and Generalized Linear Discriminant Analysis on Support Vectors

Linear discriminant analysis based on the generalized singular value decomposition (LDA/GSVD) has recently been introduced to circumvent the nonsingularity restriction that occurs in classical LDA, so that a dimension-reducing transformation can be effectively obtained for undersampled problems. In this paper, relationships between support vector machines (SVMs) and the generalized linea...


AUC Maximization with K-hyperplane

The area under the ROC curve (AUC) is a measure of interest in various machine learning and data mining applications, and it has been widely used to evaluate classification performance on heavily imbalanced data. Kernelized AUC maximization machines have demonstrated superior generalization ability compared to linear AUC machines because of their capability in modeling the complex nonlinear st...
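For readers unfamiliar with the metric: AUC equals the probability that a uniformly drawn positive example is scored above a uniformly drawn negative one. A minimal pairwise computation (illustrative only, not the maximization machinery this abstract refers to; the `pairwise_auc` name is an assumption):

```python
import numpy as np

def pairwise_auc(scores, labels):
    """AUC as the fraction of positive-negative pairs ranked
    correctly by `scores` (ties count half)."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

scores = np.array([0.9, 0.8, 0.4, 0.3, 0.2])
labels = np.array([1, 1, 0, 1, 0])
print(pairwise_auc(scores, labels))  # 5 of 6 pairs ordered correctly: 0.833...
```

Because AUC is defined over pairs rather than individual predictions, it stays informative even when one class vastly outnumbers the other, which is why it is the metric of choice for imbalanced data.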


Large Scale, Large Margin Classification using Indefinite Similarity Measures

Despite the success of the popular kernelized support vector machines, they have two major limitations: they are restricted to Positive Semi-Definite (PSD) kernels, and their training complexity scales at least quadratically with the size of the data. Many natural measures of similarity between pairs of samples are not PSD, e.g., invariant kernels, and those that are implicitly or explicitly defi...
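The PSD restriction mentioned above is easy to check empirically: a symmetric similarity matrix is PSD iff its smallest eigenvalue is nonnegative. A minimal sketch (the `is_psd` helper and the example matrices are assumptions for illustration):

```python
import numpy as np

def is_psd(K, tol=1e-10):
    """A matrix is PSD if it is symmetric and all eigenvalues
    are nonnegative (up to numerical tolerance)."""
    if not np.allclose(K, K.T):
        return False
    return np.linalg.eigvalsh(K).min() >= -tol

# An RBF Gram matrix is PSD by construction.
X = np.random.default_rng(1).standard_normal((5, 3))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_rbf = np.exp(-sq)
print(is_psd(K_rbf))  # True

# A symmetric similarity matrix need not be: this one has
# eigenvalues 3 and -1, so it is indefinite.
K_bad = np.array([[1.0, 2.0], [2.0, 1.0]])
print(is_psd(K_bad))  # False
```

Handing an indefinite matrix like `K_bad` to a standard SVM solver breaks the convexity of the dual problem, which is exactly the limitation such indefinite-similarity methods aim to lift.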



Journal title:

Volume   Issue

Pages  -

Publication date: 2017